Learning Conditional Probabilities from Incomplete Data: An Experimental Comparison
Authors
Marco Ramoni, Knowledge Media Institute, The Open University
Paola Sebastiani, Statistics Department, The Open University
Abstract
This paper compares three methods, the EM algorithm, Gibbs sampling, and Bound and Collapse (BC), to estimate conditional probabilities from incomplete databases in a controlled experiment. Results show a substantial equivalence of the estimates provided by the three methods and a dramatic gain in efficiency using BC.

Reprinted from: Proceedings of Uncertainty 99: Seventh International Workshop on Artificial Intelligence and Statistics, Morgan Kaufmann, San Mateo, CA, 1999.

Address: Marco Ramoni, Knowledge Media Institute, The Open University, Milton Keynes, United Kingdom MK7 6AA. Phone: +44 (1908) 655721, fax: +44 (1908) 653169, email: [email protected], url: http://kmi.open.ac.uk/people/marco.
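As a hedged illustration of the kind of estimation problem compared here, the following sketch shows one of the three methods, the EM algorithm, estimating P(X) and P(Y | X) for two binary variables when some records have X missing. It is not the authors' implementation or experimental setup; the variable names, true parameters, and missingness rate are assumptions made for the example.

```python
# Illustrative EM for a two-node network X -> Y with binary variables,
# where some records have X missing.  A sketch of the general technique,
# not the implementation compared in the paper.
import random

random.seed(0)

# --- Generate a toy incomplete dataset (assumed true parameters below) ---
TRUE_PX1 = 0.3                          # P(X = 1)
TRUE_PY1_GIVEN_X = {0: 0.2, 1: 0.8}     # P(Y = 1 | X)
data = []
for _ in range(2000):
    x = 1 if random.random() < TRUE_PX1 else 0
    y = 1 if random.random() < TRUE_PY1_GIVEN_X[x] else 0
    if random.random() < 0.4:           # 40% of records lose X (missing at random)
        x = None
    data.append((x, y))

# --- EM to estimate P(X) and P(Y | X) from the incomplete data ---
px1 = 0.5
py1 = {0: 0.5, 1: 0.5}
for _ in range(50):
    n_x = {0: 0.0, 1: 0.0}      # expected counts of X
    n_xy1 = {0: 0.0, 1: 0.0}    # expected counts of (X, Y = 1)
    for x, y in data:
        if x is not None:
            w = {x: 1.0, 1 - x: 0.0}
        else:
            # E-step: posterior over the missing X given the observed Y
            like = {
                0: (1 - px1) * (py1[0] if y == 1 else 1 - py1[0]),
                1: px1 * (py1[1] if y == 1 else 1 - py1[1]),
            }
            z = like[0] + like[1]
            w = {0: like[0] / z, 1: like[1] / z}
        for xv in (0, 1):
            n_x[xv] += w[xv]
            if y == 1:
                n_xy1[xv] += w[xv]
    # M-step: re-estimate the parameters from the expected counts
    px1 = n_x[1] / (n_x[0] + n_x[1])
    py1 = {xv: n_xy1[xv] / n_x[xv] for xv in (0, 1)}

print(f"P(X=1) ~ {px1:.3f}, P(Y=1|X=0) ~ {py1[0]:.3f}, P(Y=1|X=1) ~ {py1[1]:.3f}")
```

For comparison, Gibbs sampling would replace the deterministic E-step with stochastic imputation of the missing entries, while Bound and Collapse first bounds the estimates consistent with every possible completion of the missing data and then collapses the bounds to point estimates; the sketch above only shows the EM side of the comparison.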
Similar resources
Learning Bayesian Networks from Incomplete Data
Much of the current research in learning Bayesian Networks fails to effectively deal with missing data. Most of the methods assume that the data is complete, or make the data complete using fairly ad-hoc methods; other methods do deal with missing data but learn only the conditional probabilities, assuming that the structure is known. We present a principled approach to learn both the Bayesian n...
An Introduction to Inference and Learning in Bayesian Networks
Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in different subjects such as disease diagnosis, weather forecasting, decision making and clustering. A BN is a graphical-probabilistic model which represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...
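To make that definition concrete, here is a minimal sketch of a Bayesian network as a directed acyclic graph plus conditional probability tables; the toy network, its variables, and all probability values are assumptions invented for this illustration, not content from the cited work.

```python
# A minimal Bayesian network: a DAG over discrete variables plus one
# conditional probability table (CPT) per node.  The example network
# (Cloudy -> Rain -> WetGrass) and its numbers are illustrative only.

# Parents of each node define the DAG.
parents = {
    "Cloudy": [],
    "Rain": ["Cloudy"],
    "WetGrass": ["Rain"],
}

# CPTs: map a tuple of parent values to P(node = True | parents).
cpt = {
    "Cloudy":   {(): 0.5},
    "Rain":     {(True,): 0.8, (False,): 0.1},
    "WetGrass": {(True,): 0.9, (False,): 0.05},
}

def joint_probability(assignment):
    """P(full assignment) as the product of the local conditionals."""
    p = 1.0
    for node, parent_list in parents.items():
        parent_values = tuple(assignment[q] for q in parent_list)
        p_true = cpt[node][parent_values]
        p *= p_true if assignment[node] else 1.0 - p_true
    return p

print(joint_probability({"Cloudy": True, "Rain": True, "WetGrass": True}))
```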
A Forgetting Model for Paired-Associate Learning
This paper presents a Markov model for paired-associate learning which is based on conclusions drawn from experimental studies of short-term memory. Learning of paired-associates is considered as a decrease in the probability of forgetting an association between trials. The ability of the model to account for experimental results is demonstrated by considering data from two experiments involving...
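One way to read that description, purely as an assumed illustration rather than the cited paper's actual model, is a forgetting probability that shrinks by a constant factor after every trial; the initial value and decay factor below are made up.

```python
# Illustrative sketch only: a forgetting probability f that decays by a
# constant factor after each study trial, so recall improves over trials.
def recall_curve(trials, f0=0.9, decay=0.6):
    f = f0                      # probability of forgetting the pair
    curve = []
    for _ in range(trials):
        curve.append(1.0 - f)   # probability of correct recall this trial
        f *= decay              # learning: forgetting becomes less likely
    return curve

print([round(p, 3) for p in recall_curve(6)])
```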
Journal:
Volume / Issue:
Pages: -
Publication date: 1999